A Generalized Iterative Scaling Algorithm for Maximum Entropy Reasoning in Relational Probabilistic Conditional Logic Under Aggregation Semantics
Abstract
Recently, different semantics for relational probabilistic conditionals and corresponding maximum entropy (ME) inference operators have been proposed. In this paper, we study the so-called aggregation semantics, which covers both a statistical and a subjective view. Computing its inference operator requires calculating the ME-distribution satisfying all probabilistic conditionals. Since each conditional induces a linear constraint on the probability distribution, the optimization problem to solve is the computation of the probability distribution with maximum entropy under linear constraints. We demonstrate how the well-known Generalized Iterative Scaling (GIS) algorithm can be applied to this optimization problem to calculate the maximum entropy distribution iteratively. We show how the linear constraints are transformed into normalized feature functions to meet the requirements of GIS, and we present a practical algorithm tailor-made for the computation of the ME-inference operator based on aggregation semantics. We also present a practical implementation of the developed algorithm.
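To make the abstract's core idea concrete, the following is a minimal sketch of the classical GIS scheme for computing a maximum entropy distribution under linear expectation constraints. It is not the paper's tailor-made algorithm; the function name, the slack-feature construction (the standard trick to make feature sums constant per world, as GIS requires normalized features), and the toy inputs in the usage example are illustrative assumptions. It assumes non-negative features and strictly positive target expectations.

```python
import numpy as np

def gis(features, targets, iters=500, tol=1e-8):
    """Generalized Iterative Scaling sketch (not the paper's algorithm).

    features: (n_worlds, n_feats) array with f_j(x) >= 0.
    targets:  desired expectations E_p[f_j], assumed > 0.
    Returns the maximum entropy distribution p over the n_worlds outcomes.
    """
    n_worlds, n_feats = features.shape
    # GIS requires the feature values of each world to sum to a constant C;
    # the standard construction adds a slack feature to pad every row to C.
    row_sums = features.sum(axis=1)
    C = row_sums.max()
    slack = (C - row_sums).reshape(-1, 1)
    F = np.hstack([features, slack])
    t = np.append(targets, C - targets.sum())  # induced slack target

    lam = np.zeros(n_feats + 1)                # log-linear parameters
    for _ in range(iters):
        # Current Gibbs distribution p(x) proportional to exp(lam . f(x)).
        logp = F @ lam
        logp -= logp.max()                     # numerical stabilization
        p = np.exp(logp)
        p /= p.sum()
        expect = F.T @ p                       # current expectations E_p[f_j]
        if np.max(np.abs(t - expect)) < tol:
            break
        # GIS multiplicative update in log form, scaled by 1/C.
        lam += np.log(t / expect) / C
    return p
```

As a toy usage example, two worlds with a single feature f(x) = x and target E_p[f] = 0.3 yield the distribution p = (0.7, 0.3), which is the entropy-maximizing distribution meeting that constraint.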
Related papers
On probabilistic inference in relational conditional logics
The principle of maximum entropy has proven to be a powerful approach for commonsense reasoning in probabilistic conditional logics on propositional languages. Due to this principle, reasoning is performed based on the unique model of a knowledge base that has maximum entropy. This kind of model-based inference fulfills many desirable properties for inductive inference mechanisms and is usually...
On Prototypical Indifference and Lifted Inference in Relational Probabilistic Conditional Logic
Semantics for formal models of probabilistic reasoning rely on probability functions that are defined on the interpretations of the underlying classical logic. When this underlying logic is of relational nature, i.e., a fragment of first-order logic, then the space needed for representing these probability functions explicitly is exponential in both the number of predicates and the number of do...
Representing Statistical Information and Degrees of Belief in First-Order Probabilistic Conditional Logic
Employing maximum entropy methods on probabilistic conditional logic has proven to be a useful approach for commonsense reasoning. Yet, the expressive power of this logic and similar formalisms is limited due to their foundations on propositional logic and in the past few years a lot of proposals have been made for probabilistic reasoning in relational settings. Most of these proposals rely on ...
Universität Dortmund an der Fakultät für Informatik Matthias Thimm
Reasoning with inaccurate information is a major topic within the fields of artificial intelligence in general and knowledge representation and reasoning in particular. This thesis deals with information that can be incomplete, uncertain, and contradictory. We employ probabilistic conditional logic as a foundation for our investigation. This framework allows for the representation of uncertain ...
Evolving Knowledge in Theory and Applications
Semantics for formal models of probabilistic reasoning rely on probability functions that are defined on the interpretations of the underlying classical logic. When this underlying logic is of relational nature, i.e., a fragment of first-order logic, then the space needed for representing these probability functions explicitly is exponential in both the number of predicates and the number of do...